Using Low-Rank Representation of Abundance Maps and Nonnegative Tensor Factorization for Hyperspectral Nonlinear Unmixing


Abstract

Tensor-based methods have been widely studied to attack inverse problems in hyperspectral imaging, since a hyperspectral image (HSI) cube can be naturally represented as a third-order tensor, which perfectly retains the spatial information of the image. In this article, we extend the linear tensor method to the nonlinear case and propose a low-rank unmixing algorithm to solve the generalized bilinear model (GBM). Specifically, the linear and nonlinear parts of the GBM are both expressed as tensors. Furthermore, the low-rank structures of the abundance maps and the nonlinear interaction maps are exploited by minimizing their nuclear norm, thus taking full advantage of the high spatial correlation in HSIs. Experiments on synthetic and real data show that the low-rank property exploited by our method improves the unmixing performance. A MATLAB demo of this work will be made available at https://github.com/LinaZhuang for the sake of reproducibility.
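To make the two ingredients of the abstract concrete, the following Python/NumPy sketch (illustrative only, not the authors' MATLAB demo) simulates one pixel under the generalized bilinear model and applies singular-value soft-thresholding, the proximal step typically used when the nuclear norm of an abundance map is minimized; all sizes, coefficients, and thresholds are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Generalized bilinear model (GBM) for one pixel ---
# y = E a + sum_{i<j} g_ij * a_i * a_j * (e_i ⊙ e_j) + noise,
# where E holds the endmember spectra column-wise and ⊙ is the
# element-wise (Hadamard) product of two endmember spectra.
L, P = 50, 3                                 # bands, endmembers (illustrative)
E = rng.random((L, P))                       # endmember matrix
a = np.array([0.5, 0.3, 0.2])                # abundances: nonnegative, sum to one
g = {(0, 1): 0.4, (0, 2): 0.1, (1, 2): 0.2}  # bilinear interaction coefficients in [0, 1]

y = E @ a
for (i, j), gij in g.items():
    y += gij * a[i] * a[j] * (E[:, i] * E[:, j])
y += 0.01 * rng.standard_normal(L)           # additive noise

# --- Nuclear-norm proximal step (singular-value soft-thresholding) ---
# Minimizing ||A||_* encourages a low-rank abundance map; its proximal
# operator shrinks the singular values of the map reshaped to the image grid.
def svt(A, tau):
    """Soft-threshold the singular values of A by tau."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

A_map = rng.random((60, 60))       # one abundance map on a 60 x 60 image grid
A_low_rank = svt(A_map, tau=3.0)   # threshold chosen arbitrarily for illustration
print(np.linalg.matrix_rank(A_low_rank, tol=1e-6))
```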


Related Articles

A Low-rank Tensor Regularization Strategy for Hyperspectral Unmixing

Tensor-based methods have recently emerged as a more natural and effective formulation to address many problems in hyperspectral imaging. In hyperspectral unmixing (HU), low-rank constraints on the abundance maps have been shown to act as a regularization which adequately accounts for the multidimensional structure of the underlying signal. However, imposing a strict low-rank constraint for the...


Nonnegative Matrix Factorization With Data-Guided Constraints For Hyperspectral Unmixing

Abstract: Hyperspectral unmixing aims to estimate a set of endmembers and corresponding abundances in pixels. Nonnegative matrix factorization (NMF) and its extensions with various constraints have been widely applied to hyperspectral unmixing. L1/2 and L2 regularizers can be added to NMF to enforce sparseness and evenness, respectively. In practice, a region in a hyperspectral image may posses...
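For context on the regularized NMF mentioned in this snippet, below is a minimal Python/NumPy sketch of multiplicative updates with an L1/2 sparsity term on the abundance matrix; the function name, penalty weight, and iteration count are assumptions and do not reproduce the data-guided constraints of the cited paper.

```python
import numpy as np

def l_half_nmf(X, r, lam=0.1, n_iter=200, eps=1e-9):
    """NMF with an L1/2 sparsity penalty on H: min ||X - W H||_F^2 + lam * sum(sqrt(H))."""
    rng = np.random.default_rng(0)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Standard multiplicative update for W (Lee-Seung form).
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        # Keep H strictly positive so the H**(-0.5) term stays finite.
        H = np.maximum(H, eps)
        # Multiplicative update for H with the extra L1/2 term in the denominator.
        H *= (W.T @ X) / (W.T @ W @ H + 0.5 * lam * H ** (-0.5) + eps)
    return W, H

# Toy usage: a nonnegative data matrix whose columns mix r = 3 sources.
X = np.random.default_rng(1).random((40, 100))
W, H = l_half_nmf(X, r=3)
print(W.shape, H.shape)
```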


Image representation using Laplacian regularized nonnegative tensor factorization

Tensors provide a better representation of image space by avoiding the information loss of vectorization. Nonnegative tensor factorization (NTF), whose objective is to express an n-way tensor as a sum of k rank-1 tensors under nonnegative constraints, has recently attracted a lot of attention for its efficient and meaningful representation. However, NTF only sees Euclidean structures in data spac...
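The rank-1 decomposition referred to above can be illustrated directly: the short sketch below reconstructs a third-order tensor from k nonnegative rank-1 terms (the CP/PARAFAC form that an NTF algorithm fits to data); the factor names and dimensions are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A third-order tensor expressed as a sum of k nonnegative rank-1 tensors:
#   T[i, j, l] = sum_r A[i, r] * B[j, r] * C[l, r]   (CP/PARAFAC form).
I, J, L, k = 10, 12, 8, 4
A = rng.random((I, k))   # nonnegative factor matrices
B = rng.random((J, k))
C = rng.random((L, k))

T = np.einsum('ir,jr,lr->ijl', A, B, C)   # reconstruction from the k rank-1 terms
print(T.shape)                            # (10, 12, 8)

# NTF fits A, B, C (all nonnegative) so that this reconstruction
# approximates an observed tensor, e.g. an HSI cube.
```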


Algorithms for Nonnegative Tensor Factorization

Nonnegative Matrix Factorization (NMF) is an efficient technique to approximate a large matrix containing only nonnegative elements as a product of two nonnegative matrices of significantly smaller size. The guaranteed nonnegativity of the factors is a distinctive property that other widely used matrix factorization methods do not have. Matrices can also be seen as second-order tensors. For som...


Tensor completion using total variation and low-rank matrix factorization

In this paper, we study the problem of recovering a tensor with missing data. We propose a new model combining total variation regularization and low-rank matrix factorization. A block coordinate descent (BCD) algorithm is developed to efficiently solve the proposed optimization model. We theoretically show that under some mild conditions, the algorithm converges to the coordinatewise minimi...
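As a rough illustration of how such a model can be minimized block by block, the sketch below alternates least-squares updates of a low-rank factorization of a mode unfolding with a gradient step on a smoothed TV surrogate, re-imposing the observed entries at each pass; this is a simplified stand-in, not the cited BCD algorithm, and the rank, weights, and step size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy tensor with missing entries, matricized along mode 1 for the
# low-rank factorization X ≈ U V (a simplification of the cited model).
T = rng.random((20, 20, 10))
X_true = T.reshape(20, -1)                 # mode-1 unfolding, 20 x 200
mask = rng.random(X_true.shape) < 0.6      # observed entries (60%)
r, lam, step, n_iter = 5, 0.1, 0.05, 100

U = rng.random((20, r))
V = rng.random((r, X_true.shape[1]))
X = np.where(mask, X_true, X_true[mask].mean())   # fill missing entries with the observed mean

def smooth_tv_grad(X):
    """Gradient of a smoothed (quadratic) TV surrogate along both axes."""
    g = np.zeros_like(X)
    g[1:, :] += X[1:, :] - X[:-1, :]
    g[:-1, :] -= X[1:, :] - X[:-1, :]
    g[:, 1:] += X[:, 1:] - X[:, :-1]
    g[:, :-1] -= X[:, 1:] - X[:, :-1]
    return 2.0 * g

for _ in range(n_iter):
    # Blocks 1 and 2: least-squares updates of the low-rank factors.
    U = X @ V.T @ np.linalg.inv(V @ V.T + 1e-6 * np.eye(r))
    V = np.linalg.inv(U.T @ U + 1e-6 * np.eye(r)) @ U.T @ X
    # Block 3: gradient step on ||X - U V||^2 + lam * smoothed-TV(X),
    # then re-impose the observed entries (the completion constraint).
    X -= step * (2.0 * (X - U @ V) + lam * smooth_tv_grad(X))
    X[mask] = X_true[mask]

# Relative error on the missing entries only.
print(np.linalg.norm((X - X_true)[~mask]) / np.linalg.norm(X_true[~mask]))
```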



Journal

Journal title: IEEE Transactions on Geoscience and Remote Sensing

Year: 2022

ISSN: 0196-2892, 1558-0644

DOI: https://doi.org/10.1109/tgrs.2021.3065990